Finding Low-rank Solutions to Matrix Problems, Efficiently and Provably

Authors

  • Dohyung Park
  • Anastasios Kyrillidis
  • Constantine Caramanis
  • Sujay Sanghavi
Abstract

A rank-r matrix X ∈ R^{m×n} can be written as a product UV^⊤, where U ∈ R^{m×r} and V ∈ R^{n×r}. One could exploit this observation in optimization: e.g., consider the minimization of a convex function f(X) over rank-r matrices, where the set of rank-r matrices is modeled via the factorization in U and V variables. Such a heuristic has been widely used before for specific problem instances, where the solution sought is (approximately) low-rank. Though such a parameterization reduces the number of variables and is more efficient in computational speed and memory requirements (of particular interest is the case r ≪ min{m, n}), it comes at a cost: f(UV^⊤) becomes a non-convex function w.r.t. U and V.

In this paper, we study such parameterization in optimization of a generic convex f and focus on first-order, gradient descent algorithmic solutions. We propose an algorithm we call the Bi-Factored Gradient Descent (BFGD) algorithm, an efficient first-order method that operates on the U, V factors. We show that when f is smooth, BFGD has local sublinear convergence, and linear convergence when f is both smooth and strongly convex. Moreover, for several key applications, we provide simple and efficient initialization schemes that provide approximate solutions good enough for the above convergence results to hold.
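To make the factored-descent idea concrete, the following is a minimal numpy sketch of gradient descent over the (U, V) factors, applied to an illustrative matrix-completion style objective. The spectral-style initialization, the 1/σ₁ step-size heuristic, and the objective are assumptions chosen for illustration; they simplify what the paper actually prescribes for BFGD.

```python
import numpy as np

def bfgd_sketch(grad_f, X0, r, step, iters=500):
    """Gradient descent over the factors (U, V) of X = U V^T.

    A simplified sketch of the factored-descent idea: the paper's BFGD also
    prescribes a specific step size, balancing, and initialization guarantees,
    all of which are reduced to simple heuristics here.
    """
    # Spectral-style initialization: best rank-r approximation of X0.
    Uf, s, Vt = np.linalg.svd(X0, full_matrices=False)
    U = Uf[:, :r] * np.sqrt(s[:r])
    V = Vt[:r, :].T * np.sqrt(s[:r])
    for _ in range(iters):
        G = grad_f(U @ V.T)                              # gradient of f at X = U V^T
        U, V = U - step * G @ V, V - step * G.T @ U      # chain rule: dU = G V, dV = G^T U
    return U, V

# Illustrative convex objective (matrix-completion style): only a random subset
# of entries of a rank-r matrix M is observed, f(X) = 0.5 * ||P_Omega(X - M)||_F^2.
rng = np.random.default_rng(0)
m, n, r = 80, 60, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.4                          # observed entries
grad_f = lambda X: mask * (X - M)
X0 = (mask * M) / 0.4                                    # rescaled observations
step = 0.5 / np.linalg.svd(X0, compute_uv=False)[0]      # common 1/sigma_1 heuristic
U, V = bfgd_sketch(grad_f, X0, r, step, iters=500)
print("relative recovery error:", np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))
```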


Similar articles

Sketchy Decisions: Convex Low-Rank Matrix Optimization with Optimal Storage

This paper concerns a fundamental class of convex matrix optimization problems. It presents the first algorithm that uses optimal storage and provably computes a low-rank approximation of a solution. In particular, when all solutions have low rank, the algorithm converges to a solution. This algorithm, SketchyCGM, modifies a standard convex optimization scheme, the conditional gradient method, t...
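For context, below is a minimal numpy sketch of the conditional gradient (Frank-Wolfe) baseline that SketchyCGM modifies, applied to an illustrative nuclear-norm-constrained completion objective. The sketching step that yields optimal storage is omitted, and the objective, radius, and step rule are assumptions for illustration only.

```python
import numpy as np

def conditional_gradient(grad_f, shape, tau, iters=300):
    """Frank-Wolfe for min f(X) subject to ||X||_* <= tau.

    The linear minimization oracle over the nuclear-norm ball is the rank-1
    matrix -tau * u1 v1^T built from the top singular pair of grad f(X).
    (SketchyCGM replaces the explicit iterate X by a low-dimensional sketch;
    here the full iterate is kept for simplicity.)
    """
    X = np.zeros(shape)
    for k in range(iters):
        G = grad_f(X)
        u, s, vt = np.linalg.svd(G, full_matrices=False)
        S = -tau * np.outer(u[:, 0], vt[0, :])           # rank-1 atom minimizing <G, S>
        gamma = 2.0 / (k + 2.0)                          # standard Frank-Wolfe step
        X = (1 - gamma) * X + gamma * S
    return X

# Illustrative problem: recover a low-rank M from a random subset of its entries.
rng = np.random.default_rng(1)
m, n, r = 50, 40, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5
grad_f = lambda X: mask * (X - M)
X_hat = conditional_gradient(grad_f, (m, n), tau=np.linalg.norm(M, "nuc"))
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```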


Dimensionality Reduction for Sparse and Structured Matrices

Dimensionality reduction has become a critical tool for quickly solving massive matrix problems. Especially in modern data analysis and machine learning applications, an overabundance of data features or examples can make it impossible to apply standard algorithms efficiently. To address this issue, it is often possible to distill data to a much smaller set of informative features or examples, ...


On the Provable Convergence of Alternating Minimization for Matrix Completion

Alternating Minimization is a widely used and empirically successful framework for Matrix Completion and related low-rank optimization problems. We give a new algorithm based on Alternating Minimization that provably recovers an unknown low-rank matrix from a random subsample of its entries under a standard incoherence assumption while achieving a linear convergence rate. Compared to previous w...
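As a rough illustration of that framework (not the specific algorithm or initialization analyzed in the paper), here is a plain alternating-minimization sketch for matrix completion in numpy; the sampling model and parameters are illustrative assumptions.

```python
import numpy as np

def altmin_completion(M_obs, mask, r, iters=50, seed=0):
    """Generic alternating minimization for matrix completion: fix V and solve a
    small least-squares problem for each row of U on its observed entries, then
    swap roles.  (A plain sketch; the cited paper adds its own initialization
    and incoherence-based analysis.)"""
    m, n = M_obs.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    for _ in range(iters):
        for i in range(m):                               # update row i of U
            cols = np.flatnonzero(mask[i])
            U[i] = np.linalg.lstsq(V[cols], M_obs[i, cols], rcond=None)[0]
        for j in range(n):                               # update row j of V
            rows = np.flatnonzero(mask[:, j])
            V[j] = np.linalg.lstsq(U[rows], M_obs[rows, j], rcond=None)[0]
    return U, V

# Toy instance: a rank-2 matrix observed on ~40% of its entries.
rng = np.random.default_rng(2)
m, n, r = 60, 50, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.4
U, V = altmin_completion(mask * M, mask, r)
print("relative error:", np.linalg.norm(U @ V.T - M) / np.linalg.norm(M))
```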


Static and Dynamic Robust PCA via Low-Rank + Sparse Matrix Decomposition: A Review

Principal Components Analysis (PCA) is one of the most widely used dimension reduction techniques. Robust PCA (RPCA) refers to the problem of PCA when the data may be corrupted by outliers. Recent work by Candes, Wright, Li, and Ma defined RPCA as a problem of decomposing a given data matrix into the sum of a low-rank matrix (true data) and a sparse matrix (outliers). The column space of the lo...
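To illustrate the low-rank + sparse model M ≈ L + S, the sketch below alternates a truncated SVD with entrywise thresholding on synthetic data. It is a generic heuristic for exposition, not any particular algorithm surveyed in the review, and the threshold and outlier model are assumptions.

```python
import numpy as np

def rpca_alternating(M, r, thresh, iters=30):
    """Generic alternating sketch of the RPCA model M ~ L + S: alternately fit a
    rank-r L by truncated SVD and a sparse S by entrywise thresholding.
    (Illustrative only; the review covers principled formulations such as
    principal component pursuit with their own guarantees.)"""
    S = np.zeros_like(M)
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :r] * s[:r]) @ Vt[:r]                  # best rank-r fit to M - S
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)         # keep large residuals as outliers
    return L, S

# Toy corrupted data: a rank-2 matrix plus a few large sparse outliers.
rng = np.random.default_rng(3)
m, n, r = 60, 40, 2
L_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
S_true = (rng.random((m, n)) < 0.05) * rng.normal(0, 10, (m, n))
L_hat, S_hat = rpca_alternating(L_true + S_true, r=r, thresh=3.0)
print("low-rank error:", np.linalg.norm(L_hat - L_true) / np.linalg.norm(L_true))
```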


Finding Low-rank Solutions of Sparse Linear Matrix Inequalities using Convex Optimization

This paper is concerned with the problem of finding a low-rank solution of an arbitrary sparse linear matrix inequality (LMI). To this end, we map the sparsity of the LMI problem into a graph. We develop a theory relating the rank of the minimum-rank solution of the LMI problem to the sparsity of its underlying graph. Furthermore, we propose three graph-theoretic convex programs to obtain a low...



Journal:
  • CoRR

Volume: abs/1606.03168

Publication date: 2016